Dimension reduction and coefficient estimation in multivariate linear regression
Authors
Abstract
We introduce a general formulation for dimension reduction and coefficient estimation in the multivariate linear model. We argue that many of the existing methods commonly used in practice can be formulated in this framework and are subject to various restrictions. We then propose a new method that is more flexible and more generally applicable. The proposed method can be formulated as a novel penalized least squares estimate, where the penalty is the Ky Fan norm of the coefficient matrix. This penalty encourages sparsity among the singular values while also shrinking the coefficient estimates, and it therefore conducts dimension reduction and coefficient estimation simultaneously in the multivariate linear model. We also propose a generalized cross-validation-type criterion for selecting the tuning parameter in the penalized least squares. Simulations and an application in financial econometrics demonstrate the competitive performance of the new method. An extension to the non-parametric factor model is also discussed.
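The Ky Fan (nuclear) norm penalty described above can be computed with a standard proximal-gradient scheme, whose proximal step is singular-value soft-thresholding. The sketch below is a minimal illustration of that generic technique, not the authors' own algorithm or tuning-parameter criterion; the function name and all parameter choices are hypothetical.

```python
import numpy as np

def nuclear_norm_pls(X, Y, lam, n_iter=500, tol=1e-8):
    """Proximal-gradient sketch for the penalized least squares problem
       min_B 0.5 * ||Y - X B||_F^2 + lam * ||B||_*,
    where ||B||_* is the nuclear (Ky Fan) norm, i.e. the sum of
    the singular values of B."""
    p, q = X.shape[1], Y.shape[1]
    B = np.zeros((p, q))
    # Step size 1/L, with L the Lipschitz constant of the smooth part's gradient.
    L = np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ B - Y)          # gradient of the least squares term
        Z = B - grad / L                  # gradient step
        # Proximal step: soft-threshold the singular values of Z.
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s = np.maximum(s - lam / L, 0.0)  # shrinks and zeroes small singular values
        B_new = (U * s) @ Vt
        if np.linalg.norm(B_new - B) < tol * max(1.0, np.linalg.norm(B)):
            B = B_new
            break
        B = B_new
    return B
```

Because the thresholding zeroes small singular values, a sufficiently large `lam` yields a low-rank (or zero) coefficient matrix, which is the dimension-reduction effect the abstract refers to.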
Related references
Moment Based Dimension Reduction for Multivariate Response Regression
Dimension reduction aims to reduce the complexity of a regression without requiring a pre-specified model. In the case of multivariate response regressions, covariance-based estimation methods for the k-th moment based dimension reduction subspaces circumvent slicing and nonparametric estimation so that they are readily applicable to multivariate regression settings. In this article, the covari...
Joint estimation of sparse multivariate regression and conditional graphical models
The multivariate regression model is a natural generalization of the classical univariate regression model for fitting multiple responses. In this paper, we propose a high-dimensional multivariate conditional regression model for constructing sparse estimates of the multivariate regression coefficient matrix that account for the dependency structure among the multiple responses. The proposed method...
Convex optimization methods for dimension reduction and coefficient estimation in multivariate linear regression
In this paper, we study convex optimization methods for computing the nuclear (or, trace) norm regularized least squares estimate in multivariate linear regression. The so-called factor estimation and selection (FES) method, recently proposed by Yuan et al. [25], conducts parameter estimation and factor selection simultaneously and has been shown to enjoy nice properties in both large and fini...
A note on extension of sliced average variance estimation to multivariate regression
Rand Corporation, Pittsburgh, PA 15213. Abstract: Many sufficient dimension reduction methodologies for univariate regression have been extended to multivariate regression. Sliced average variance estimation (SAVE) has the potential to recover more reductive information, and recent developments enable us to test the dimension and predictor effects with distributions comm...
Penalized Bregman Divergence Estimation via Coordinate Descent
Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. More recently, the coordinate descent (CD) algorithm was developed by Friedman et al. (2007) for penalized linear regression and penalized logistic regression and was shown to be computationally superior. This paper explores...
Journal:
Volume Issue
Pages -
Publication year: 2007